    Average resistance of toroidal graphs

    The average effective resistance of a graph is a relevant performance index in many applications, including distributed estimation and control of network systems. In this paper, we study how the average resistance depends on the graph topology and specifically on the dimension of the graph. We concentrate on d-dimensional toroidal grids and we exploit the connection between resistance and Laplacian eigenvalues. Our analysis provides tight estimates of the average resistance, which are key to studying its asymptotic behavior when the number of nodes grows to infinity. In dimension two, the average resistance diverges: in this case, we are able to capture its rate of growth when the sides of the grid grow at different rates. In higher dimensions, the average resistance is bounded uniformly in the number of nodes: in this case, we conjecture that its value is of order 1/d for large d. We prove this fact for hypercubes and when the side lengths go to infinity. Comment: 24 pages, 6 figures, to appear in SIAM Journal on Control and Optimization (SICON).
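    The eigenvalue connection mentioned above is standard: for a graph on N nodes with Laplacian eigenvalues 0 = λ₁ < λ₂ ≤ … ≤ λ_N, the average effective resistance over node pairs is R_avg = (2/(N−1)) · Σ_{i≥2} 1/λᵢ, and the eigenvalues of a toroidal grid are sums of cycle eigenvalues along each axis. A minimal numerical sketch of this computation (our own illustration, not the paper's analytical estimates):

```python
import numpy as np

def torus_laplacian_eigenvalues(sides):
    """Laplacian eigenvalues of a toroidal grid with the given side lengths.

    Each axis of length n contributes cycle eigenvalues 2 - 2*cos(2*pi*k/n),
    and the torus eigenvalues are all sums of one eigenvalue per axis.
    """
    axes = [2.0 - 2.0 * np.cos(2.0 * np.pi * np.arange(n) / n) for n in sides]
    eig = axes[0]
    for a in axes[1:]:
        eig = (eig[:, None] + a[None, :]).ravel()
    return eig

def average_resistance(sides):
    """Average effective resistance over node pairs:
    R_avg = (2 / (N - 1)) * sum over nonzero eigenvalues of 1 / lambda."""
    eig = np.sort(torus_laplacian_eigenvalues(sides))
    n = eig.size
    return 2.0 / (n - 1) * np.sum(1.0 / eig[1:])  # eig[0] is the zero eigenvalue
```

    For a single axis this reduces to a cycle; e.g. on the triangle C₃ every pair has effective resistance 2/3, which the formula reproduces from the eigenvalues {0, 3, 3}.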

    Limited benefit of cooperation in distributed relative localization

    Important applications in robotic and sensor networks require distributed algorithms to solve the so-called relative localization problem: a node-indexed vector has to be reconstructed from measurements of differences between neighboring nodes. In a recent note, we studied the estimation error of a popular gradient descent algorithm, showing that the mean square error has a minimum at a finite time, after which the performance worsens. This paper proposes a suitable modification of this algorithm incorporating more realistic "a priori" information on the position. The new algorithm's performance decreases monotonically to the optimal one. Furthermore, we show that the optimal performance is approximated, up to a 1 + ε factor, within a time which is independent of the graph and of the number of nodes. This convergence time is closely related to the minimum exhibited by the previous algorithm, and both lead to the following conclusion: in the presence of noisy data, cooperation is only useful up to a certain limit. Comment: 11 pages, 2 figures, submitted to conference.
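    The gradient descent scheme discussed above can be sketched on a toy instance. Everything below (the ring graph, noise level, step size, and iteration count) is an illustrative assumption of ours, not the paper's experimental configuration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20
edges = [(i, (i + 1) % n) for i in range(n)]  # a ring graph, for illustration
x_true = rng.normal(size=n)                   # node-indexed vector to reconstruct
noise = 0.1 * rng.normal(size=len(edges))
z = np.array([x_true[u] - x_true[v] for u, v in edges]) + noise  # edge measurements

x = np.zeros(n)     # initial estimate
gamma = 0.2         # step size; must be below 2 / lambda_max(Laplacian) = 0.5 here
for _ in range(500):
    grad = np.zeros(n)
    for (u, v), zu in zip(edges, z):
        r = x[u] - x[v] - zu   # residual of this edge's measurement
        grad[u] += r           # each node only uses its neighbors' values,
        grad[v] -= r           # so the update is distributed
    x -= gamma * grad

# The vector is only identifiable up to a common offset; align means to compare.
x -= x.mean() - x_true.mean()
```

    Each iteration is a gradient step on the least-squares cost Σ_edges (x_u − x_v − z_e)², which a node can compute from its neighbors alone; with noisy measurements the residual error cannot be driven to zero, which is the regime where the paper's "limited benefit of cooperation" conclusion applies.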

    Cost-aware caching: optimizing cache provisioning and object placement in ICN

    Caching is frequently used by Internet Service Providers as a viable technique to reduce the latency perceived by end users while jointly offloading network traffic. While the cache hit-ratio is generally considered in the literature as the dominant performance metric for such systems, in this paper we argue that a critical missing piece has so far been neglected. Adopting a radically different perspective, we explicitly account for the cost of content retrieval, i.e., the cost associated with the external bandwidth needed by an ISP to retrieve the contents requested by its customers. Interestingly, we discover that classical cache provisioning techniques that maximize cache efficiency (i.e., the hit-ratio) lead to suboptimal solutions with higher overall cost. To show this mismatch, we propose two optimization models that either minimize the overall costs or maximize the hit-ratio, jointly providing cache sizing, object placement and path selection. We formulate a polynomial-time greedy algorithm to solve the two problems and analytically prove its optimality. We provide numerical results and show that significant cost savings are attainable via a cost-aware design.
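    As a rough illustration of cost-aware greedy placement, consider a much-simplified single-cache model of our own devising (the paper's formulation jointly handles cache sizing, placement and path selection; this sketch does not): each object has a size, a request rate, and an external retrieval cost, and we greedily cache the objects that save the most cost per unit of space.

```python
def greedy_placement(objects, capacity):
    """Greedily fill one cache of the given capacity.

    objects: list of (name, size, request_rate, retrieval_cost) tuples.
    Caching an object saves request_rate * retrieval_cost per unit time,
    so we rank objects by saving per unit of cache space.
    """
    ranked = sorted(objects,
                    key=lambda o: o[2] * o[3] / o[1],  # saving density
                    reverse=True)
    placed, used = [], 0
    for name, size, rate, cost in ranked:
        if used + size <= capacity:
            placed.append(name)
            used += size
    return placed

# Hypothetical catalog: "b" has a high hit-count potential per byte only if
# retrieval were expensive; under cost-aware ranking it loses to "a" and "c".
catalog = [("a", 1, 10, 1.0), ("b", 2, 1, 0.1), ("c", 1, 5, 2.0)]
```

    Note the contrast the paper exploits: a hit-ratio-maximizing policy would rank purely by request rate, whereas the cost-aware ranking also weighs how expensive each object is to fetch externally.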

    Where are your Manners? Sharing Best Community Practices in the Web 2.0

    The Web 2.0 fosters the creation of communities by offering users a wide array of social software tools. While the success of these tools is based on their ability to support different interaction patterns among users by imposing as few limitations as possible, the communities they support are not free of rules (just think about the posting rules in a community forum or the editing rules in a thematic wiki). In this paper we propose a framework for the sharing of best community practices in the form of a (potentially rule-based) annotation layer that can be integrated with existing Web 2.0 community tools (with specific focus on wikis). This solution is characterized by minimal intrusiveness and plays nicely within the open spirit of the Web 2.0 by providing users with behavioral hints rather than by enforcing strict adherence to a set of rules. Comment: ACM Symposium on Applied Computing, Honolulu, United States (2009).

    Spazia-HPP: Hybrid plug-in for small vehicle

    This paper presents a novel concept, the Hybrid Power Pack (HPP), a hybridization kit for transforming small city cars powered by an original diesel engine into parallel hybrid vehicles. The study was jointly conducted by the University of Rome "Sapienza" and the ENEA Casaccia research center. The idea is to design a hybrid powertrain that can be installed in a typical microcar, which means that all systems and components are constrained by the limited space available in the vehicle's motor compartment. This paper discusses the details of the mechanical and electrical realization of the powertrain, presents the simulation of a small city car equipped with the HPP, and analyzes the results. The hybrid system also includes the battery pack, which is composed of twenty-four Li-ion cells made by EIG, connected in series. The voltage and temperature of the storage system are monitored by a Battery Management System (BMS). All the above components are connected and managed by a control unit. The HPP presented in this paper achieves a reduction in fuel consumption of more than 20%. The HPP, with its management strategy and the addition of the "plug-in function", makes the hybrid vehicle suitable in terms of performance and consumption under all driving conditions. The strategy behind the "plug-in function" could serve as a guideline for further developments and experiments, because it offers a simple hardware layout and a real reduction in fuel consumption.

    PULP-HD: Accelerating Brain-Inspired High-Dimensional Computing on a Parallel Ultra-Low Power Platform

    Computing with high-dimensional (HD) vectors, also referred to as hypervectors, is a brain-inspired alternative to computing with scalars. Key properties of HD computing include a well-defined set of arithmetic operations on hypervectors, generality, scalability, robustness, fast learning, and ubiquitous parallel operations. HD computing is about manipulating and comparing large patterns (binary hypervectors with 10,000 dimensions), making its efficient realization on minimalistic ultra-low-power platforms challenging. This paper describes the acceleration of HD computing and the optimization of its memory accesses and operations on a silicon prototype of the PULPv3 4-core platform (1.5 mm², 2 mW), surpassing the state-of-the-art classification accuracy (on average 92.4%) with a simultaneous 3.7× end-to-end speed-up and 2× energy saving compared to its single-core execution. We further explore the scalability of our accelerator by increasing the number of inputs and the classification window on a new generation of the PULP architecture featuring bit-manipulation instruction extensions and a larger number of cores (8). Together, these enable a near-ideal speed-up of 18.4× compared to the single-core PULPv3.
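    The arithmetic operations on binary hypervectors alluded to above are standard in the HD-computing literature: binding (componentwise XOR), bundling (componentwise majority vote), and similarity (normalized Hamming distance). A minimal NumPy sketch of these primitives (our illustration, not the PULP implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # hypervector dimensionality, as in the abstract

def random_hv():
    """A random binary hypervector; two of these are near-orthogonal."""
    return rng.integers(0, 2, size=D, dtype=np.uint8)

def bind(a, b):
    """Binding (XOR): associates two hypervectors; the result is
    dissimilar to both inputs, and bind(bind(a, b), b) recovers a."""
    return a ^ b

def bundle(hvs):
    """Bundling (majority vote over an odd number of hypervectors):
    superimposes inputs; the result stays similar to each of them."""
    return (2 * np.sum(hvs, axis=0) > len(hvs)).astype(np.uint8)

def similarity(a, b):
    """Fraction of matching bits, in [0, 1]; ~0.5 for unrelated vectors."""
    return 1.0 - np.mean(a != b)
```

    The robustness the abstract cites follows from these statistics: in 10,000 dimensions, unrelated vectors sit tightly around 0.5 similarity, so class hypervectors built by bundling remain distinguishable even after many bit errors.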

    Patent Disclosure and R&D Competition in Pharmaceuticals.

    The prominent role played by patents within the pharmaceutical domain is unquestionable. In this paper we take an unusual perspective and focus on a relatively neglected implication of patents: the effect of patent-induced information disclosure (of both successes and failures) on the dynamics of R&D and market competition. The study builds upon the combination of two large datasets, linking the information about patents to firm-level data on R&D projects and their outcome. Two case studies in the fields of anti-inflammatory compounds and cancer research complement our analysis. We show the important role played by patent disclosure in shaping firms' technological trajectories through the possibility of reciprocal monitoring in a context of parallel research efforts, and suggest the importance of enhancing the diffusion of information concerning failures, not only to avoid wasteful duplication of innovative efforts, but also as a tool for the identification of promising research trajectories. This paper is the result of the "R&D competition" research project carried out jointly with Adrian Towse and Martina Garau of the Office of Health Economics, London, UK. A preliminary draft of the paper was presented at the DRUID Summer Conference 2006 (Copenhagen) and at the 11th ISS Conference (Sophia-Antipolis).

    Role of perfusion machines in the setting of clinical liver transplantation. A qualitative systematic review

    Growing enthusiasm around machine perfusion (MP) in clinical liver transplantation (LT) may be the preamble to a standardized practice for expanding the donor pool. The present systematic review investigated all liver transplantations performed using grafts treated with MP. A systematic review of 309 papers was performed. Eventually, 27 articles were enrolled in the study. A total of 173 cases was reported. Only 12 cohort studies were identified; the remaining ones were case reports or case series. Hypothermic machine perfusion was performed in 102 (59.0%), normothermic machine perfusion in 65 (37.6%), and controlled oxygenated rewarming in the remaining 6 (3.4%) cases. Donor characteristics, evaluation of graft quality and end-points were not homogeneous among the studies. Overall, post-LT results were excellent, with 1.2% and 4.0% of patients experiencing primary non-function and ischemic-type biliary lesions, respectively.